Search results for "emotion recognition"
Showing 10 of 28 documents
O4.8. CAN YOU SPOT EMOTIONS? FACIAL EMOTION RECOGNITION AND GENETIC RISK FOR PSYCHOSIS
2019
Background Facial emotion recognition (FER) is a key component of social cognition that has consistently been found to be impaired in schizophrenia. Deficits in global facial affect recognition have also been found in First Episode Psychosis (FEP), with the same severity as at later stages, especially for anger recognition. The literature to date has shown intermediate emotion recognition ability both in people with a family history of psychotic disorders and in unaffected relatives of psychotic patients, on a continuum between patients and healthy controls. Furthermore, the Polygenic Risk Score (PRS) for schizophrenia has been found to be associated with social cognition, especially with facial emotion identi…
Affective matching of odors and facial expressions in infants: shifting patterns between 3 and 7 months.
2016
Recognition of emotional facial expressions is a crucial skill for adaptive behavior. Past research suggests that at 5 to 7 months of age, infants look longer at an unfamiliar dynamic angry/happy face which emotionally matches a vocal expression. This suggests that they can match stimuli from distinct modalities on their emotional content. In the present study, olfaction-vision matching abilities were assessed across different age groups (3, 5 and 7 months) using dynamic expressive faces (happy vs. disgusted) and distinct hedonic odor contexts (pleasant, unpleasant and control) in a visual-preference paradigm. At all ages the infants were biased toward the disgust faces. This visual bias…
Eigenexpressions: Emotion Recognition Using Multiple Eigenspaces
2013
This paper presents an appearance-based holistic method for expression recognition. A two stage supervised learning approach is used. At the first stage, training images are used to compute one subspace per expression. At the second stage, the same images are used to train a classifier. In this step, Euclidean distances from each image to each particular subspace are used as the input to the classifier. The resulting system significantly outperforms the baseline eigenfaces method on the Cohn-Kanade data set, with performance gains in the range 10%-20%.
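The two-stage idea — one subspace per expression, then distances to each subspace as classifier inputs — can be sketched as follows. This is a minimal NumPy illustration under assumed data, not the paper's implementation: random clusters stand in for the Cohn-Kanade images, and the nearest-subspace rule stands in for the second-stage classifier.

```python
import numpy as np

def fit_subspace(X, k):
    # X: (n_samples, n_pixels) images of a single expression class.
    # Compute the class mean and the top-k principal directions via SVD.
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def dist_to_subspace(x, mean, basis):
    # Euclidean distance from image x to the affine subspace:
    # the norm of the component of (x - mean) not captured by the basis.
    c = x - mean
    proj = basis.T @ (basis @ c)
    return np.linalg.norm(c - proj)

rng = np.random.default_rng(0)
# Two hypothetical expression classes as well-separated random clusters.
happy = rng.normal(0.0, 1.0, (20, 64)) + 5.0
angry = rng.normal(0.0, 1.0, (20, 64)) - 5.0
models = [fit_subspace(happy, 3), fit_subspace(angry, 3)]

x = rng.normal(0.0, 1.0, 64) + 5.0   # unseen image from the "happy" cluster
dists = [dist_to_subspace(x, m, B) for m, B in models]
# In the paper this distance vector feeds a trained classifier;
# here nearest subspace already separates the toy classes.
print(int(np.argmin(dists)))
```

The distances, rather than the raw pixels, become the feature vector for the second-stage classifier.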
First impression misleads emotion recognition
2019
Recognition of others' emotions is a key life ability that guides one's own choices and behavior, and it hinges on the recognition of others' facial cues. Independent studies indicate that facial appearance-based evaluations affect social behavior, but little is known about how facial appearance-based trustworthiness evaluations influence the recognition of specific emotions. We tested the hypothesis that first impressions based on facial appearance affect the recognition of basic emotions. A total of 150 participants completed a dynamic emotion recognition task. In a within-subjects design, the participants viewed videos of individuals with trustworthy-looking, neutral, or untrustworthy-lo…
Feature Extraction and Selection for Pain Recognition Using Peripheral Physiological Signals.
2019
In pattern recognition, the selection of appropriate features is paramount to both the performance and the robustness of the system. Over-reliance on machine learning-based feature selection methods can therefore be problematic, especially when conducted on small snapshots of data. The results of these methods, if adopted without proper interpretation, can lead to sub-optimal system design or, worse, the abandonment of otherwise viable and important features. In this work, a deep exploration of pain-based emotion classification was conducted to better understand differences in the results of the related literature. In total, 155 different time domain and frequency domain features were e…
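The kind of data-driven feature ranking the abstract cautions about can be sketched with a simple univariate score. This is a generic illustration under assumed synthetic data, not the paper's method: features are ranked by a two-class separability score (absolute mean difference over pooled standard deviation), and only feature 0 is made informative by construction.

```python
import numpy as np

def univariate_scores(X, y):
    # Two-class separability per feature:
    # |mean difference| / pooled standard deviation.
    a, b = X[y == 0], X[y == 1]
    pooled = np.sqrt((a.var(axis=0) + b.var(axis=0)) / 2) + 1e-12
    return np.abs(a.mean(axis=0) - b.mean(axis=0)) / pooled

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 50)
X = rng.normal(size=(100, 6))
X[y == 1, 0] += 3.0          # feature 0 is informative, the rest are noise

scores = univariate_scores(X, y)
top = np.argsort(scores)[::-1][:2]   # keep the 2 highest-scoring features
print(int(top[0]))
```

On a small or unrepresentative sample, such scores can demote genuinely useful features — which is exactly the risk of adopting automated rankings without interpretation.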
Age, gender, and puberty influence the development of facial emotion recognition
2015
Our ability to differentiate between simple facial expressions of emotion develops between infancy and early adulthood, yet few studies have explored the developmental trajectory of emotion recognition using a single methodology across a wide age-range. We investigated the development of emotion recognition abilities through childhood and adolescence, testing the hypothesis that children’s ability to recognise simple emotions is modulated by chronological age, pubertal stage and gender. In order to establish norms, we assessed 478 children aged 6-16 years, using the Ekman-Friesen Pictures of Facial Affect. We then modelled these cross-sectional data in terms of competence in accurate recogn…
Exploiting Correlation between Body Gestures and Spoken Sentences for Real-time Emotion Recognition
2017
Humans communicate their affective states through different media, both verbal and non-verbal, often used at the same time. Knowledge of the emotional state plays a key role in providing personalized and context-related information and services. This is the main reason why several algorithms have been proposed in the last few years for automatic emotion recognition. In this work we exploit the correlation between one's affective state and the simultaneous body expressions in terms of speech and gestures. Here we propose a system for real-time emotion recognition from gestures. In the first step, the system builds a trusted dataset of association pairs (motion data -> emotion pattern), a…
Ensemble of Hankel Matrices for Face Emotion Recognition
2015
In this paper, a face emotion is considered as the result of the composition of multiple concurrent signals, each corresponding to the movements of a specific facial muscle. These concurrent signals are represented by means of a set of multi-scale appearance features that might be correlated with one or more concurrent signals. The extraction of these appearance features from a sequence of face images yields a set of time series. This paper proposes to use the dynamics regulating each appearance feature time series to discriminate among different face emotions. To this purpose, an ensemble of Hankel matrices corresponding to the extracted time series is used for emotion classification withi…
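The core construction — a Hankel matrix built from a feature time series, whose rank reflects the order of the underlying dynamics — can be sketched as follows. This is a minimal NumPy illustration with an assumed sinusoidal feature trajectory, not the paper's ensemble classifier.

```python
import numpy as np

def hankel_matrix(series, rows):
    # Stack shifted windows of a 1-D time series:
    # H[i, j] = series[i + j], so anti-diagonals are constant.
    n = len(series) - rows + 1
    return np.array([series[i:i + n] for i in range(rows)])

# Hypothetical appearance-feature time series from a face sequence.
t = np.linspace(0, 2 * np.pi, 20)
feature = np.sin(t)

H = hankel_matrix(feature, 4)
print(H.shape)
# A noiseless sinusoid obeys a second-order linear recurrence,
# so its Hankel matrix has (numerical) rank 2.
print(int(np.linalg.matrix_rank(H, tol=1e-8)))
```

Each appearance feature contributes one such matrix; the ensemble of these matrices then serves as the representation for classification.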
Body Gestures and Spoken Sentences: A Novel Approach for Revealing User’s Emotions
2017
In the last decade, there has been a growing interest in emotion analysis research, which has been applied in several areas of computer science. Many authors have contributed to the development of emotion recognition algorithms, considering textual or non-verbal data as input, such as facial expressions, gestures or, in the case of multi-modal emotion recognition, a combination of them. In this paper, we describe a method to detect emotions from gestures using the skeletal data obtained from Kinect-like devices as input, as well as a textual description of their meaning. The experimental results show that the correlation existing between body movements and spoken user sentence(s) can be u…
The coupling between face and emotion recognition from early adolescence to young adulthood
2020
Abstract In the present study, we investigated whether differentiation occurs between identity and emotion processing as development in both domains proceeds across adolescence and during the transition into young adulthood. A sample of 343 participants between 11 and 24 years performed the Glasgow Face Matching Task (Burton, White, & McNeill, 2010) for identity-based face recognition and the Cambridge Face-Battery (Golan, Baron-Cohen, & Hill, 2006) for complex emotion recognition. Our results show adult levels of face recognition by the end of early adolescence, while complex emotion recognition continues to develop into young adulthood. Although each ability matures at a different rate,…